Encryption Wars
 
by Gode Davis, Web Server Online Magazine (09/1998)

 
Engaged in a tug-of-war between the vested interests of encryption software developers and law enforcement, the Clinton Administration is slowly giving ground to the commercial sector. Meanwhile, the war being waged by academic cryptographers and civil libertarians to make 'strong' encryption freely available is something else again.


Considering the vast amount of information traversing the World Wide Web, it's amazing just how little content is camouflaged in code. Barring the occasional encrypted message found in newsgroups, the relatively few code-rich Web pages maintained by crypto-hackers and their ilk, or the increasing number of pages reserved by large financial institutions for credit card transactions, security on the Internet is seldom assigned priority. Even sensitive communications emailed between commercial interests are all too often insecure--protected by so-called "weak" algorithms if encrypted at all. Given all the encryption technology that exists, why isn't it more widely deployed?

In fact, mostly out of the mainstream's eye, two distinctly different encryption wars are being fought. The course of either could determine how encryption is eventually perceived. One embraces the rights of individuals to post encryption algorithms on the Internet or to more readily access software capable of generating strong (64- to 128-bit) encryption for their personal needs. The other is a fundamental conflict between business and government over just how "encrypted" the Internet should be.

Peter D. Junger's fight concerns constitutional cryptography issues involving individuals and the First Amendment. In August 1996, Junger, a professor of law at Case Western Reserve University Law School, Cleveland, OH, filed suit to challenge the then-controversial International Traffic in Arms Regulations (ITAR) export rules, arguing, in effect, that they hindered his ability to teach a cryptography course. Because foreign students were attending his classes, he wanted to publish his class materials on the Internet. Junger's materials included cryptographic software source code that the Office of Defense Trade Controls, the U.S. Department of State agency charged with enforcing ITAR, originally saw fit to classify as "munitions" along with chemical and biological weapons, tanks, heavy artillery, fire-control radar and military aircraft. Under ITAR, the mere posting of proscribed cryptographic software on the Internet was construed as an illegal export--a crime subject to fines of up to $1 million or a maximum prison term of 10 years.

By the time Junger's case reached its denouement, nearly two years had elapsed. ITAR had been replaced (on December 30, 1996) by the slightly less stringent Export Administration Regulations (EAR). Under EAR, Junger's classroom materials would no longer be classed as munitions; instead, they would be treated as "dual-use" cryptography (that is, cryptography that can serve both civilian and military purposes) and regulated as "encryption items"--still export-controlled, and still odious to the government, but perhaps not quite so. (Because Junger didn't actually post anything illegal, he was never charged with a crime.)

On July 3, 1998, Judge James Gwin of the Federal District Court for the Northern District of Ohio issued an opinion in the case of Junger v. Daley (U.S. Secretary of Commerce William A. Daley), granting summary judgment for the government and holding that computer software is not constitutionally protected speech because it is "inherently functional." In his decision, Gwin wrote, "Source code is purely functional in a way that instructions, manuals and recipes are not. Unlike instructions, a manual or a recipe, source code actually performs the function it describes. While a recipe provides instructions to a cook, source code is a device, like embedded circuitry in a telephone, that actually does the function of encryption."

According to Junger, Gwin's "embedded circuitry" simile struck many programmers as "magnificently incoherent." "Judge Gwin wrote a very clear opinion, he did his job, but he got it gloriously wrong," Junger says. He will appeal. Yet even as his case and the sundry legal skirmishes of other academic cryptographers play out, their ideological shadow hasn't managed to engulf commercial interests. "The Junger case, and other free speech arguments posed by individuals, while possibly of great constitutional weight, simply aren't perceived as a factor in the business/government equation," says Seth Finkelstein, a Massachusetts Institute of Technology alumnus who describes himself as a longtime electronic activist and Internet expert with a predilection for encryption matters. "There are various players with agendas that do not match up. The goal of a business is to make a profit, not to protect freedom."

Although encryption product vendors typically show scant interest in cases like Junger's, the U.S. government's stubborn will to enforce encryption export policies draws consistent fire from the industry. Current federal regulations ban the export of cryptographic technology that relies on keys longer than 56 bits (until January 1997, the limit was just 40 bits). While U.S. companies may not export the ability to create files encrypted with keys exceeding 56 bits, such files may be decrypted, or unscrambled, outside the United States without restriction. "The U.S. government is concerned with the export of the ability to create 'uncrackable' files, not to read them," says Dean Compoginis, marketing manager for Aladdin Systems Inc., Watsonville, CA, a maker of encryption products.
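
For a sense of what those numbers mean: each added key bit doubles the work facing a brute-force attacker, so the move from 40 to 56 bits raised the bar by a factor of 2^16. A back-of-the-envelope sketch in Python (illustrative arithmetic only):

    # Each added key bit doubles the number of keys an attacker must try.
    keys_40 = 2 ** 40           # ~1.1 trillion possible 40-bit keys
    keys_56 = 2 ** 56           # ~72 quadrillion possible 56-bit keys
    print(keys_56 // keys_40)   # 65536 -- a 56-bit key is 2^16 times harder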

However, others believe law enforcement agencies vehemently oppose the proliferation of any files they can't crack. "U.S. signals intelligence and law enforcement special interest groups like the NSA [National Security Agency], FBI [Federal Bureau of Investigation] and FINCEN [Financial Crimes Enforcement Network] have vested interests in being able to scan the U.S. public's email and the foreign public's email," according to Adam Back, a U.K.-based crypto-activist, cryptographer and self-proclaimed "cypherpunk." In fact, according to Back, as well as Whitfield Diffie, distinguished scientist at Sun Microsystems Inc., Palo Alto, CA, who, along with electrical engineer Martin Hellman, introduced public-key cryptography in 1976 (a milestone in the field), law enforcement interests might not be the most compelling reason for the U.S. government's systematic suppression of cryptography development domestically. "I am almost alone in believing that the driving force is signals intelligence and that the 'law enforcement issue' is a smoke screen because it is easier to say that you want to enforce the law than that you want to break the laws of other countries by spying on their communications," Diffie says.
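
For readers unfamiliar with that 1976 milestone, the core of the Diffie-Hellman idea is that two parties can agree on a shared secret over an open channel without ever transmitting it. A minimal Python sketch with toy numbers (real use demands enormous primes and randomly generated secrets):

    p, g = 23, 5             # tiny public parameters; real primes run to hundreds of digits
    a, b = 6, 15             # Alice's and Bob's private values (normally random)
    A = pow(g, a, p)         # Alice publishes g^a mod p
    B = pow(g, b, p)         # Bob publishes g^b mod p
    assert pow(B, a, p) == pow(A, b, p)   # each side derives the same shared secret
    print(pow(B, a, p))      # 2 -- an eavesdropper seeing only p, g, A and B can't easily compute this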

Police agencies argue that export limits are necessary to allow them to gain access to communications among terrorists (as they define them), drug dealers, child pornographers or those involved in criminal conspiracies. FINCEN is downright fearful of anonymous electronic payment systems being perverted for evil purposes, including money laundering. On August 30, 1997, FBI Director Louis Freeh hinted during a congressional hearing that his agency would like to go a step further and have encryption limits placed on domestic products as well. According to Bert-Jaap Koops, a doctoral student at Eindhoven University of Technology in The Netherlands and author of the "Crypto Law Survey" (a regularly updated report on international legislation related to cryptography, which can be found on his Web site), such a prohibition would place the United States in the same company as Iran, Iraq and the People's Republic of China. But sensing an immediate backlash, the Clinton Administration backed away from Freeh's remarks.

Speaking at the 13th Annual Software Publishers Association Conference in Washington, DC, on September 9, 1997, U.S. Vice President Al Gore summed up U.S. policy: "The government must be wary of suffocating [the encryption software] industry with regulation in the new digital age, but we must be able to strike a balance between the legitimate concerns of the law enforcement community and the needs of the marketplace."

On the flip side, the Software Publishers Association (SPA) and other U.S. trade organizations for encryption product vendors, including the Computer Systems Policy Project (CSPP) and Americans for Computer Privacy (ACP), argue that strong encryption technology is already widely available in other countries and that limits are crippling security software development in the United States.

In fact, industry insiders can point to legitimate reasons why U.S. businesses need more encryption, not less. "The ability to encrypt sensitive files is important to legal, accounting and financial professionals, as well as anyone who wants to protect their private information from prying eyes," says Aladdin's Compoginis. His firm makes 128-bit versions of its Private File encryption software for domestic use and a 40-bit version for export outside the United States. "In this global economy, many U.S.-based companies have partners and field offices all over the world. Having the ability to confidently transmit sensitive information into these countries is a primary need for most multinational companies," Compoginis says.

Sun's Diffie expresses a similar sentiment: "We want the export controls relaxed so that we can serve the legitimate needs of our customers in other parts of the world for secured network computing." According to Diffie, an unfettered encryption climate established for American commercial interests would not only lead to better security for electronic commerce but would serve to enhance security for the private communications of U.S. and foreign citizens and for the U.S. critical infrastructure.

As far as the industry is concerned, it would be bad enough if export limits were the only tenet of the U.S. government's encryption policy. But they aren't; far from it. The backbone of U.S. encryption policy is what Back refers to as "Government Access to Communications Keys (GACK)," or, to phrase it more elegantly, key recovery. Back and fellow cypherpunks Andy Brown and Piete Brooks defined the essence of what a cryptographic key is in a 1995 paper they coauthored about a key-discovery method called the Simple Key Search Protocol (SKSP): "The security of an [encryption] algorithm lies entirely in the key that it uses. Without the key, an attacker is left with the task of trying every possible key until one is found that decrypts the data." Key recovery would require U.S. vendors to put the keys to the encryption code they develop in the hands of "trusted" third parties, from whom the keys could be retrieved by law enforcement officials if illegal activity is suspected.
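
The exhaustive search Back describes is simple to demonstrate. The Python sketch below brute-forces a toy 16-bit XOR cipher (a stand-in for a real algorithm, with all names illustrative); the principle is identical for 56-bit DES, only the keyspace is 2^40 times larger:

    # Toy cipher: repeating 2-byte XOR. XORing twice with the same key decrypts.
    def toy_encrypt(data: bytes, key: int) -> bytes:
        kb = key.to_bytes(2, "big")
        return bytes(b ^ kb[i % 2] for i, b in enumerate(data))

    secret_key = 0xBEEF
    ciphertext = toy_encrypt(b"ATTACK AT DAWN", secret_key)

    # The attacker's task: try every possible key until one yields
    # recognizable plaintext.
    for candidate in range(2 ** 16):
        if toy_encrypt(ciphertext, candidate).startswith(b"ATTACK"):
            print(f"key found: {candidate:#06x}")   # 0xbeef
            break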

"Key escrow"--the act of trusting a third party with an algorithm's key--has been part of the Clinton Administration's encryption policy mix since January 1997. As it stands, export of 56-bit cryptography is allowed if the keys are escrowed within two years; stronger cryptography can be exported if it is escrowed immediately. The risks inherent to key escrow encryption technology include stolen or compromised keys. In fact, according to "The Risks of Key Recovery, Key Escrow, and Trusted Third Party Encryption," a report authored by 11 ad hoc cryptographers and computer scientists published in October by the Washington, DC-based, nonprofit Center For Democracy and Technology, creation of centralized storage locations for the keys used to encrypt data, such as key recovery centers, are likely to draw criminal attacks. Such weakening factors serve to defeat the purpose of encryption security so that many U.S. firms don't even bother to encrypt sensitive files. What's more, according to Back, key recovery schemes don't even work. "Two people acting in concert can always send messages that circumvent key recovery," Back says.

It gets worse. According to the Electronic Frontier Foundation (EFF), a San Francisco, CA-based watchdog group, the U.S. government has deliberately misled the public about the Data Encryption Standard, or DES. Adopted as a federal standard in 1977 to protect unclassified communications and data, DES uses a fixed-size key of 56 bits, meaning a user must employ precisely the right combination of 56 ones and zeros to decode information correctly. Until recently, the government had pressed the industry to limit encryption to DES (and even weaker forms) without revealing how easy it might be to crack. Although stronger algorithms, such as the domestic versions of Aladdin's Private File and of Pretty Good Privacy (PGP) from Network Associates Inc., Santa Clara, CA, are readily available to domestic users, their widespread use has been strongly discouraged--called "injurious to national security" and even characterized by some U.S. government officials as "excessive" use of encryption.

When DES was inevitably cracked this summer, it wasn't by some gargantuan megacorporation with myriad resources, but by the EFF, a modest-size nonprofit, pro-privacy civil liberties organization using its so-called "DES Cracker" machine--a rather ordinary but dedicated piece of hardware built for less than $250,000. According to the jubilant EFF press release issued July 17, finding the right 56-bit key took less than three days. DES had long ceased to be a secure algorithm, yet Trusted Information Systems Inc., Glenwood, MD, a global provider of security solutions for enterprise networks, reported in December that it could be found in 281 foreign and 466 domestic encryption products--roughly one-third to one-half of the entire market. Cracking the code will probably add urgency to ongoing efforts to create the Advanced Encryption Standard (AES), a new government-fostered replacement standard. The process of adopting AES will take at least two years, and likely much longer. Why so slow? Largely because even the new encryption standard would have "GACK" capabilities built into it--if not key recovery, perhaps a "backdoor" access field. One example of such a field was introduced by the Clinton Administration in 1994. Called the Law Enforcement Access Field (LEAF), it was transmitted along with each encrypted message and contained information identifying the encryption key used.
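
The EFF figure implies an extraordinary search speed. A rough Python estimate, under the standard assumption that an exhaustive search must test about half the keyspace on average:

    # Implied search rate of the EFF DES Cracker (rough estimate).
    keyspace = 2 ** 56
    seconds = 3 * 24 * 3600            # "less than three days"
    rate = (keyspace / 2) / seconds    # assume half the keys tried on average
    print(f"~{rate:.1e} keys/second")  # ~1.4e+11 -- over 100 billion per second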

Ah, but with summer past, those leaves might soon be falling. On May 12, Sens. John Ashcroft (R-MO) and Patrick J. Leahy (D-VT) introduced their "E-Privacy" Act. The bill outlines a pro-privacy approach to computer security that would, if passed into law, "protect the domestic use of strong encryption without key recovery or other backdoors for government eavesdropping, ease export controls to allow U.S. companies to sell their encryption products overseas, strengthen protections from government access to decryption keys and create unprecedented new protections for data stored in networks and cell phone location information." While the new bill still probably wouldn't protect freedoms for the likes of Junger, it could go a long way toward relaxing tensions in the business-government encryption war.